
    What Happened to Risk Management During the 2008-09 Financial Crisis?

    When dealing with market risk under the Basel II Accord, variation pays in the form of lower capital requirements and higher profits. Typically, GARCH-type models are chosen to forecast Value-at-Risk (VaR) using a single risk model. In this paper we illustrate two useful variations to the standard mechanism for choosing forecasts, namely: (i) combining different forecast models for each period, such as a daily model that forecasts the supremum or infimum value of the VaR; (ii) alternatively, selecting a single model to forecast VaR and then modifying the daily forecast, depending on the recent history of violations under the Basel II Accord. We illustrate these points using the Standard and Poor's 500 Composite Index. In many cases we find significant decreases in the capital requirements, while incurring a number of violations that stays within the Basel II Accord limits. Keywords: risk management; violations; conservative risk strategy; aggressive risk strategy; value-at-risk forecast
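
    The combination idea in (i) can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's GARCH-type models: it swaps in a RiskMetrics-style EWMA forecast and a historical-simulation quantile as the candidate models, then takes the infimum (most conservative) or supremum (most aggressive) of the candidate VaR forecasts for the day.

        import numpy as np

        def ewma_variance(returns, lam=0.94):
            """RiskMetrics-style EWMA forecast of the next day's conditional variance."""
            var = np.var(returns[:20])                 # warm-up estimate
            for r in returns[20:]:
                var = lam * var + (1 - lam) * r ** 2
            return var

        def var_forecasts(returns, alpha=0.01):
            """Next-day 1% VaR forecasts (as negative returns) from two simple models."""
            z = -2.326                                 # approximate 1% standard normal quantile
            ewma = z * np.sqrt(ewma_variance(returns))
            hist = np.quantile(returns[-250:], alpha)  # one-year historical simulation
            return {"ewma": ewma, "hist_sim": hist}

        rng = np.random.default_rng(0)
        rets = rng.standard_normal(1000) * 0.01        # placeholder for S&P 500 returns
        f = var_forecasts(rets)
        conservative = min(f.values())                 # infimum: most negative VaR forecast
        aggressive = max(f.values())                   # supremum: least conservative forecast
        print(f, conservative, aggressive)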

    GFC-Robust Risk Management Strategies under the Basel Accord

    A risk management strategy is proposed as being robust to the Global Financial Crisis (GFC) by selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast is based on the median of the point VaR forecasts of a set of conditional volatility models. This risk management strategy is GFC-robust in the sense that maintaining the same risk management strategies before, during and after a financial crisis would lead to comparatively low daily capital charges and violation penalties. The new method is illustrated by using the S&P500 index before, during and after the 2008-09 global financial crisis. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria. The median VaR risk management strategy is GFC-robust as it provides stable results across different periods relative to other VaR forecasting models. The new strategy based on combined forecasts of single models is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions. Keywords: Value-at-Risk (VaR); daily capital charges; optimizing strategy; robust forecasts; violation penalties; global financial crisis; Basel II Accord; aggressive risk management strategy; conservative risk management strategy
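
    A minimal sketch of the median combination described above: stack the point VaR forecasts of several conditional volatility models and take the cross-sectional median each day. The forecast matrix here is simulated, and the capital-charge line uses one common textbook statement of the Basel II market-risk charge (the larger of today's VaR and (3 + k) times the trailing 60-day average VaR), so it is illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)
        # rows = trading days, columns = candidate models (e.g. GARCH, EGARCH, GJR, ...);
        # VaR is expressed here as a positive loss figure
        var_forecasts = np.abs(rng.normal(0.02, 0.005, size=(300, 5)))
        median_var = np.median(var_forecasts, axis=1)      # combined (GFC-robust) forecast

        k = 0.0                                            # Basel violation penalty, in [0, 1]
        charges = [max(median_var[t], (3 + k) * median_var[t - 60:t].mean())
                   for t in range(60, len(median_var))]    # daily capital charge per day
        print(np.mean(charges))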

    A decision rule to minimize daily capital charges in forecasting value-at-risk

    Under the Basel II Accord, banks and other Authorized Deposit-taking Institutions (ADIs) have to communicate their daily risk estimates to the monetary authorities at the beginning of the trading day, using a variety of Value-at-Risk (VaR) models to measure risk. Sometimes the risk estimates communicated using these models are too high, thereby leading to large capital requirements and high capital costs. At other times, the risk estimates are too low, leading to excessive violations, so that realised losses are above the estimated risk. In this paper we propose a learning strategy that complements existing methods for calculating VaR and lowers daily capital requirements, while restricting the number of endogenous violations within the Basel II Accord penalty limits. We suggest a decision rule that responds to violations in a discrete and instantaneous manner, while adapting more slowly in periods of no violations. We apply the proposed strategy to the Standard & Poor's 500 Index and show there can be substantial savings in daily capital charges, while restricting the number of violations to within the Basel II penalty limits. Keywords: value-at-risk; daily capital charges; optimizing strategy; risk forecasts; endogenous violations; frequency of violations
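
    The flavour of such a decision rule can be sketched as follows. This is not the authors' rule, only an assumed stand-in: a scale factor widens the VaR forecast immediately after a violation and relaxes it geometrically in violation-free periods; the jump and decay parameters are placeholders.

        def adjusted_var(base_var, returns, jump=1.5, decay=0.98):
            """Scale a base VaR forecast: widen it at once after a violation,
            relax it slowly while no violations occur (illustrative rule only)."""
            scale, out = 1.0, []
            for var_t, r_t in zip(base_var, returns):
                adj = scale * var_t                       # VaR as a negative return threshold
                out.append(adj)
                if r_t < adj:                             # violation: realised loss exceeds forecast
                    scale *= jump                         # react discretely and instantaneously
                else:
                    scale = max(1.0, scale * decay)       # adapt slowly in calm periods
            return out

        # usage (names hypothetical): adjusted = adjusted_var(garch_var_series, return_series)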

    A Tool for Integer Homology Computation: Lambda-AT Model

    In this paper, we formalize the notion of the lambda-AT-model (where λ is a non-null integer) for a given chain complex, which allows the computation of homological information in the integer domain while avoiding the Smith Normal Form of the boundary matrices. We present an algorithm for computing such a model, obtaining the Betti numbers, the prime numbers p involved in the invariant factors of the torsion subgroup of homology, the number of invariant factors that are a power of p, and a set of representative cycles of generators of homology mod p, for each p. Moreover, we establish the minimum valid λ for such a construction, which cuts down the computational costs related to the torsion subgroup. The tools described here are useful for determining topological information of nD structured objects such as simplicial, cubical or simploidal complexes, and are applicable to extracting such information from digital pictures. Comment: Journal Image and Vision Computing, Volume 27 Issue 7, June, 200
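
    The notion of "homological information mod p" can be illustrated with a generic field-coefficient computation. The sketch below is not the lambda-AT-model algorithm from the paper, just a plain Gaussian elimination over Z/pZ that recovers Betti numbers mod p for a toy chain complex (the boundary of a triangle, i.e. a circle).

        def rank_mod_p(rows, p):
            """Rank of an integer matrix over Z/pZ, by Gaussian elimination (p prime)."""
            m = [[x % p for x in row] for row in rows]
            rank, n_rows, n_cols = 0, len(m), len(m[0]) if m else 0
            for col in range(n_cols):
                pivot = next((r for r in range(rank, n_rows) if m[r][col]), None)
                if pivot is None:
                    continue
                m[rank], m[pivot] = m[pivot], m[rank]
                inv = pow(m[rank][col], p - 2, p)          # modular inverse of the pivot
                m[rank] = [(x * inv) % p for x in m[rank]]
                for r in range(n_rows):
                    if r != rank and m[r][col]:
                        f = m[r][col]
                        m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
                rank += 1
            return rank

        # boundary matrix d1 of the hollow triangle: rows = vertices, columns = edges
        d1 = [[-1, -1,  0],
              [ 1,  0, -1],
              [ 0,  1,  1]]
        p = 2
        r1 = rank_mod_p(d1, p)
        b0 = 3 - r1          # dim C_0 minus rank of d1
        b1 = 3 - r1          # dim C_1 minus rank d1 minus rank d2 (there is no d2 here)
        print(b0, b1)        # -> 1 1: one connected component and one 1-cycle, as for a circle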

    Tell me your age and I tell you what you trust: the moderating effect of generations

    Purpose: The proliferation of social commerce websites has allowed consumers to share and exchange information, experiences, advice and opinions. Recently, information provided by users has come to be considered more trustworthy than information shared by companies. However, the way in which users interact with technology can vary with age, and generational cohorts show different shopping behaviors, interests and attitudes. Hence, the way users process information (user-generated vs company-generated) can affect trust differently. Drawing on trust transfer theory and generational cohort theory, the purpose of this paper is to analyze the effects of user- and company-generated information in building the trust of three different cohorts (Generations X, Y and Z). Design/methodology/approach: The data were collected through an online survey. The sample comprised 715 users of social commerce websites, aged between 16 and 55 years. The model was estimated using partial least squares (PLS) with the statistical software SmartPLS 3. Findings: The empirical results show that the generational cohorts display different patterns. Generation X transfers trust to social commerce websites mainly from trust in information generated by companies, while Generation Z transfers trust mainly from information generated by users. Finally, Generation Y, in contrast to previous findings about millennials, develops trust based on company-generated information to an even greater extent than does Generation X. Originality/value: The originality of this study lies in its analysis of generational differences when it comes to trusting one type of information over another. This study contributes to the idea that users cannot be considered as a whole but must be segmented into generational cohorts.
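
    As a rough, simplified proxy for the multi-group comparison described above (the study itself used PLS in SmartPLS 3), one could fit a separate regression of trust on the two information types within each cohort and compare the coefficients. The column names and the data file below are hypothetical.

        import pandas as pd
        import statsmodels.formula.api as smf

        # hypothetical survey file with columns: trust, ugc_info, cgc_info, cohort
        df = pd.read_csv("survey.csv")

        for cohort, sub in df.groupby("cohort"):           # e.g. 'GenX', 'GenY', 'GenZ'
            fit = smf.ols("trust ~ ugc_info + cgc_info", data=sub).fit()
            print(cohort, fit.params[["ugc_info", "cgc_info"]].round(3).to_dict())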

    Social commerce users' optimal experience: stimuli, response and culture

    Social commerce users' experience is generated during socio-commercial interactions. Therein, users receive utilitarian and hedonic stimuli that form their experience and influence their responses. However, research is needed to understand how this experience is generated. Based on the stimulus-organism-response framework and flow theory, this study analyzes how a hedonic stimulus (here called sPassion) and a utilitarian stimulus (usability) affect users' flow experience (organism) to positively impact emotional and behavioral loyalty (users' responses). Furthermore, as social commerce users are culturally diverse, the moderating effect of cultural background is studied, drawing on Hofstede's cultural dimensions. Findings show that the hedonic stimulus impacts social commerce users' flow experience more strongly than the utilitarian stimulus does. Once users reach the state of optimal experience, their positive responses are reflected in their increased intention to spread social word of mouth, to return to the website and to repurchase on it. Additionally, optimal user experience in social commerce is generated mainly through hedonic stimuli and, while social commerce environments can be culturally diverse, cultural background does not imply changes in users' behavioral patterns. This study theoretically advances research on social commerce users' experience. Likewise, the findings guide online retailers in optimizing user experience via hedonic stimuli to enhance loyalty.

    Implementation of an extended ZINB model in the study of low levels of natural gastrointestinal nematode infections in adult sheep

    Background: In this study, two traits related to resistance to gastrointestinal nematodes (GIN) were measured in 529 adult sheep: faecal egg count (FEC) and activity of immunoglobulin A in plasma (IgA). In dry years, FEC can be very low in semi-extensive systems, such as the one studied here, which makes identifying animals that are resistant or susceptible to infection a difficult task. A zero-inflated negative binomial (ZINB) model was used to calculate the extent of zero inflation for FEC; the model was extended to include information from the IgA responses. Results: In this dataset, 64% of animals had zero FEC, while the ZINB model suggested that 38% of sheep had not been recently infected with GIN. Therefore, 26% of sheep were predicted to be infected animals with egg counts that were zero or below the detection limit, and likely to be relatively resistant to nematode infection. IgA activities of all animals were then used to decide which of the sheep with zero egg counts had been exposed and which had not been recently exposed. Animals with zero FEC and high IgA activity were considered resistant, while animals with zero FEC and low IgA activity were considered not recently infected. For the animals considered to have been exposed to the infection, the correlations among the studied traits were estimated, and the influence of these traits on the discrimination between unexposed and infected animals was assessed. Conclusions: The model presented here improved the detection of infected animals with zero FEC. The correlations calculated here will be useful in developing a reliable index of GIN resistance that could assist the study of host resistance in studies based on natural infection, especially in adult sheep, and also the design of breeding programs aimed at increasing resistance to parasites.
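
    A basic zero-inflated negative binomial fit of this kind can be sketched with statsmodels; the column names and the choice of IgA as the sole covariate for both the count and the inflation parts are assumptions for illustration, not the paper's exact specification.

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

        # hypothetical data with one row per sheep: fec (egg count), iga (IgA activity)
        df = pd.read_csv("sheep.csv")
        X = sm.add_constant(df[["iga"]])              # covariates for the count part
        X_infl = sm.add_constant(df[["iga"]])         # covariates for the zero-inflation part

        model = ZeroInflatedNegativeBinomialP(df["fec"], X, exog_infl=X_infl, inflation="logit")
        res = model.fit(method="bfgs", maxiter=500, disp=False)
        print(res.summary())    # inflation-part coefficients appear with an 'inflate_' prefix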

    From sPassion to sWOM: The role of flow

    Purpose - Social commerce websites entail a completely new scenario for sharing experiences and opinions due to their richness in terms of social interactions. Nowadays, users can interact with the company and with other users; hence, it seems important to study how social stimuli affect users. Drawing on the stimulus-organism-response framework and flow theory, the purpose of this paper is to propose that the social stimulus (named social passion (sPassion)) has a positive effect on the organism (state of flow), which leads to users' positive response (via social word of mouth (sWOM)). Design/methodology/approach - The data were collected through an online survey in 2015. The sample consists of 771 users of social commerce websites, of which 51 percent are male and 49 percent female, aged between 16 and 80 years. Structural equation modeling was used to analyze the data with the statistical software SPSS version 22 and EQS 6. Findings - The empirical results confirm that passionate users are prone to experience a state of flow and, as a consequence, to share positive sWOM. Originality/value - This study contributes to the literature on customers' online participation, and the findings should help companies develop social commerce websites that boost users' exchange of information.
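
    The sPassion -> flow -> sWOM structure can be expressed as a small structural equation model. The sketch below uses the semopy package as an open-source stand-in for the EQS 6 / SPSS 22 workflow reported in the paper; the construct names, indicator names and data file are hypothetical placeholders.

        import pandas as pd
        from semopy import Model

        # measurement model (three indicators per construct) and structural paths
        desc = """
        sPassion =~ pas1 + pas2 + pas3
        flow     =~ flw1 + flw2 + flw3
        sWOM     =~ wom1 + wom2 + wom3
        flow ~ sPassion
        sWOM ~ flow
        """

        df = pd.read_csv("social_commerce_survey.csv")    # hypothetical indicator data
        model = Model(desc)
        model.fit(df)
        print(model.inspect())                            # path estimates for sPassion -> flow -> sWOM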

    Automatic design of deterministic and non-halting membrane systems by tuning syntactical ingredients

    To solve the programmability issue of membrane computing models, the automatic design of membrane systems is a newly initiated and promising research direction. In this paper, we propose an automatic design method, the Permutation Penalty Genetic Algorithm (PPGA), for designing a deterministic and non-halting membrane system by tuning membrane structures, initial objects and evolution rules. The main ideas of PPGA are the introduction of a permutation encoding technique for a membrane system, a penalty-function approach for evaluating a candidate membrane system, and a genetic algorithm for evolving a population of membrane systems toward a successful one that fulfils a given computational task. Experimental results show that PPGA can successfully accomplish the automatic design of a cell-like membrane system for computing the square of n (where n ≄ 1 is a natural number) and can find the minimal membrane systems with respect to their membrane structures, alphabet, initial objects and evolution rules for fulfilling the given task. We also provide guidelines on how to set the parameters of PPGA.
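
    The three ingredients named above (permutation encoding, a penalty-based fitness, and a standard genetic-algorithm loop) can be illustrated generically. The sketch below is not the authors' PPGA: the penalty function is a stand-in, whereas in PPGA it would score how well the decoded membrane system computes the square of n.

        import random

        def penalty_fitness(perm):
            """Stand-in penalty: count positions where perm deviates from sorted order."""
            return sum(1 for a, b in zip(perm, sorted(perm)) if a != b)

        def order_crossover(p1, p2):
            """OX crossover: keep a slice of p1, fill the rest in the order of p2."""
            a, b = sorted(random.sample(range(len(p1)), 2))
            hole = set(p1[a:b])
            rest = [x for x in p2 if x not in hole]
            return rest[:a] + p1[a:b] + rest[a:]

        def ppga_like(n=10, pop_size=30, gens=200, mut=0.2):
            pop = [random.sample(range(n), n) for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=penalty_fitness)              # lower penalty = fitter candidate
                if penalty_fitness(pop[0]) == 0:
                    break
                parents = pop[:pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    child = order_crossover(*random.sample(parents, 2))
                    if random.random() < mut:              # swap mutation preserves the permutation
                        i, j = random.sample(range(n), 2)
                        child[i], child[j] = child[j], child[i]
                    children.append(child)
                pop = parents + children
            return pop[0]

        print(ppga_like())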

    Macroseismic estimation of earthquake parameters

    The derivation of earthquake parameters from macroseismic (intensity) data is an inveterate problem. Yet for earthquakes in the pre-instrumental period (roughly, before 1900), intensity data points (IDPs) are the only form of numerical data available to the seismologist. Producing a numerate, consistent catalogue of historical earthquakes that can be combined in a compatible way with modern instrumental data requires some system for estimating what instrumental parameters would have been obtained had seismometers been in operation. Successive catalogue authors have had to deal with this problem as they saw fit; but as most earthquake catalogues have been compiled as national initiatives, one finds that one type of method has been used in one country, something else in another, and so on. This leads to obvious problems of inconsistency when it comes to studies that need to transcend national borders. A major aim of the NA-4 module of the European Framework project NERIES is to produce a catalogue of European earthquakes before 1900 in which there is the greatest possible level of internal consistency in the determination of earthquake parameters. This means the use of uniform procedures for determining earthquake parameters over the whole of Europe. Finding suitable procedures that can be used for this is a difficult task, and is the subject of this report.

    The parameters to be determined are essentially the location and the size of each earthquake. Precisely how one defines location in this context is arguable: one speaks of the “macroseismic epicentre”, but this is not necessarily exactly the same as an epicentre in the sense of the surface projection of the point where an earthquake rupture initiates. Where an earthquake rupture is large, the distribution of high intensities may delineate its extent, but there is no possibility of determining the initiating point, and probably not much interest in doing so either. The co-ordinates that will be used will be those of the centre point of the rupture, something approximating the focus from which the seismic energy radiated. For this reason, the term barycentre is sometimes preferred (Cecić et al. 1996). From the point of view of seismic hazard, arguably such a point is of greater interest in terms of reconstructing the seismic field. Since earthquakes occur in three dimensions, as well as the latitude and longitude co-ordinates, one also needs some sort of depth of focus. This is evidently more meaningful for smaller earthquakes of limited rupture dimensions.

    “Size” has to be considered here to mean “magnitude”; whereas many earlier historical earthquake catalogues were content to use epicentral intensity, Io, as a size measure, for modern applications, and for consistency with modern data sets, this is not enough, and magnitude, preferably moment magnitude, Mw, has to be estimated.
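
    As a toy illustration of the barycentre idea (not the NERIES NA-4 procedure, whose methods are considerably more involved), one could take the macroseismic epicentre as the intensity-weighted centroid of the IDPs; the coordinates, intensities and weighting scheme below are made-up placeholders.

        import numpy as np

        # hypothetical IDPs: (latitude, longitude, assigned intensity)
        idps = np.array([
            [45.10, 7.60, 7.0],
            [45.20, 7.75, 6.0],
            [45.05, 7.50, 8.0],
            [44.95, 7.40, 6.5],
        ])
        lat, lon, intensity = idps[:, 0], idps[:, 1], idps[:, 2]

        w = intensity - intensity.min() + 1.0        # simple weights favouring high intensities
        barycentre = (np.average(lat, weights=w), np.average(lon, weights=w))
        print(barycentre)                            # crude "macroseismic barycentre" estimate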
    • 
